Generalized locality preserving Maxi-Min Margin Machine
Authors
Abstract
Research on large margin classifiers from the "local" and "global" views has become an active topic in machine learning and pattern recognition. Inspired by the Maxi-Min Margin Machine (M⁴), a typical local-and-global learning machine, and by the idea of Locality Preserving Projections (LPP), we propose a novel large margin classifier, the Generalized Locality Preserving Maxi-Min Margin Machine (GLPM), in which within-class matrices are constructed from the labeled training points in a supervised way and then used to build the classifier. The within-class matrices of GLPM preserve the intra-class manifold of the training sets, as well as the covariance matrices that indicate the global projection direction in the M⁴ model. Moreover, the connections among GLPM, M⁴ and LFDA are analyzed theoretically, and we show that GLPM can be regarded as a generalized M⁴ machine. GLPM is also more robust, since it requires no assumption on the data distribution, whereas the M⁴ machine assumes Gaussian data. Experiments on data sets from the machine learning repository demonstrate its advantage over M⁴ in both local and global learning performance.
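The abstract's central construction — supervised within-class matrices that preserve the intra-class manifold, in the spirit of LPP — can be sketched as follows. This is a hypothetical illustration, not the paper's exact formulation: the function name `within_class_locality_matrix`, the heat-kernel affinity, and the width parameter `t` are assumptions standing in for whatever supervised affinity the authors actually use.

```python
import numpy as np

def within_class_locality_matrix(X, y, t=1.0):
    """Build one locality-preserving matrix Xc^T L Xc per class,
    where L is the graph Laplacian of a heat-kernel affinity graph
    restricted to points sharing the same label (supervised, LPP-style).

    X : (n_samples, n_features) data matrix
    y : (n_samples,) class labels
    t : heat-kernel width (hypothetical parameter)
    """
    matrices = {}
    for c in np.unique(y):
        Xc = X[y == c]                                    # points of class c only
        # pairwise squared Euclidean distances within the class
        d2 = ((Xc[:, None, :] - Xc[None, :, :]) ** 2).sum(-1)
        W = np.exp(-d2 / t)                               # heat-kernel affinities
        np.fill_diagonal(W, 0.0)                          # no self-edges
        D = np.diag(W.sum(axis=1))                        # degree matrix
        L = D - W                                         # graph Laplacian (PSD)
        matrices[c] = Xc.T @ L @ Xc                       # within-class locality matrix
    return matrices
```

Such a matrix could replace the per-class covariance in an M⁴-style margin constraint, so that the denominator penalizes directions that tear apart nearby same-class points rather than directions of high global variance.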
Similar articles
Maxi-Min Margin Machine: Learning Large Margin Classifiers Locally and Globally
In this paper, we propose a novel large margin classifier, called the Maxi-Min Margin Machine (M⁴). This model learns the decision boundary both locally and globally. In comparison, other large margin classifiers construct separating hyperplanes either only locally or only globally. For example, a state-of-the-art large margin classifier, the support vector machine (SVM), considers data only locally...
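For reference, the M⁴ model described in this snippet is usually stated as a second-order-cone-constrained margin problem; a sketch of the standard formulation (notation may differ from the papers above), where $\mathbf{x}_i$, $\mathbf{y}_j$ are the two classes and $\Sigma_{\mathbf{x}}$, $\Sigma_{\mathbf{y}}$ their covariance matrices:

```latex
\begin{align}
\max_{\rho,\;\mathbf{w}\neq\mathbf{0},\;b}\quad & \rho \\
\text{s.t.}\quad
& \frac{\mathbf{w}^{\top}\mathbf{x}_i + b}{\sqrt{\mathbf{w}^{\top}\Sigma_{\mathbf{x}}\mathbf{w}}} \ge \rho,
\qquad i = 1,\dots,N_x, \\
& \frac{-(\mathbf{w}^{\top}\mathbf{y}_j + b)}{\sqrt{\mathbf{w}^{\top}\Sigma_{\mathbf{y}}\mathbf{w}}} \ge \rho,
\qquad j = 1,\dots,N_y .
\end{align}
```

Each point's signed distance to the hyperplane is normalized by its own class's projected spread, which is what makes the margin simultaneously local (per point) and global (per class covariance); GLPM replaces these covariances with locality-preserving within-class matrices.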
Maxi-Min discriminant analysis via online learning
Linear Discriminant Analysis (LDA) is an important dimensionality reduction algorithm, but its performance is usually limited on multi-class data. Such limitation is incurred by the fact that LDA actually maximizes the average divergence among classes, whereby similar classes with smaller divergence tend to be merged in the subspace. To address this problem, we propose a novel dimensionality re...
An Intelligent Credit Forecasting System Using Supervised Nonlinear Dimensionality Reductions
Kernel classifiers (such as support vector machines) have been successfully applied in numerous areas, and have demonstrated excellent performance. However, due to the high dimensionality and nonlinear distribution of financial input data in credit rating forecasting, finding a suitable low dimensional subspace by nonlinear dimensionality reductions is a key step to improve classifier performan...
A Novel Support Vector Machine with Globality-Locality Preserving
Support vector machine (SVM) is regarded as a powerful method for pattern classification. However, the solution of the primal optimal model of SVM is susceptible to the class distribution and may result in a non-robust solution. In order to overcome this shortcoming, an improved model, support vector machine with globality-locality preserving (GLPSVM), is proposed. It introduces globality-locality ...
Multi-granularity distance metric learning via neighborhood granule margin maximization
Learning a distance metric from training samples is often a crucial step in machine learning and pattern recognition. Locality, compactness and consistency are considered as the key principles in distance metric learning. However, the existing metric learning methods just consider one or two of them. In this paper, we develop a multi-granularity distance learning technique. First, a new index, ...
Journal: Neural Networks (the official journal of the International Neural Network Society)
Volume: 36, Issue: -
Pages: -
Published: 2012